Meta Launches Small Reasoning Model MobileLLM-R1 as Enterprises Shift Toward Small AI
Meta recently released MobileLLM-R1, a small reasoning model that has renewed attention on the use of "small AI" in the enterprise. In the past, an AI model's capability was often equated with its parameter count, and many models now reach hundreds of billions or even trillions of parameters. Yet ultra-large models create real problems for enterprise users: little control over the underlying systems, dependence on third-party cloud services, and unpredictable costs. Small language models (SLMs) have emerged to address these pain points.